Jensen-Shannon divergence, Fisher information, and Wootters’ hypothesis

Authors

  • M. Casas
  • P. W. Lamberti
  • A. Plastino
  • A. R. Plastino
Abstract

M. Casas (1), P. W. Lamberti (2), A. Plastino (3,4), and A. R. Plastino (4,5)

(1) Departament de Física and IMEDEA, Universitat de les Illes Balears, 07122 Palma de Mallorca, Spain
(2) Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba, Argentina
(3) Universidad Nacional de La Plata, C.C. 727, 1900 La Plata, Argentina
(4) Argentine National Research Council (CONICET), C.C. 727, 1900 La Plata, Argentina
(5) Department of Physics, University of Pretoria, Pretoria 0002, South Africa

Similar articles

Jensen divergence based on Fisher's information

The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. This quantity, in contrast to other Jensen divergences, is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. So it is appropriate and ...
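
The snippet above pairs the Jensen construction with the Fisher information rather than the Shannon entropy. A minimal numerical sketch of that idea follows, assuming the Jensen-Fisher divergence is built exactly like the Jensen-Shannon one but with the Fisher information functional F[p] = ∫ p'(x)²/p(x) dx as generator; the grid discretization and function names here are illustrative choices, not taken from the paper.

```python
import numpy as np

def fisher_information(p, x):
    """Fisher information F[p] = ∫ p'(x)^2 / p(x) dx on a discrete grid."""
    dp = np.gradient(p, x)
    return np.trapz(dp**2 / p, x)

def jensen_fisher(p, q, x):
    """Jensen-type divergence generated by the Fisher information:
    JF(p, q) = (F[p] + F[q]) / 2 - F[(p + q) / 2].
    Nonnegative because F is a convex functional of the density."""
    m = 0.5 * (p + q)
    return 0.5 * (fisher_information(p, x) + fisher_information(q, x)) \
        - fisher_information(m, x)

# Two unit-variance Gaussians with shifted means
x = np.linspace(-10.0, 10.0, 4001)
p = np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)
q = np.exp(-0.5 * (x - 1.0)**2) / np.sqrt(2.0 * np.pi)
print(jensen_fisher(p, q, x))  # strictly positive for p != q
```

Because the generator involves the density's gradient, small local oscillations in p or q change the value sharply, which is the sensitivity the abstract emphasizes.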

A family of statistical symmetric divergences based on Jensen's inequality

We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimum average d...
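
The construction described here is easy to state concretely. Below is a small sketch, under the assumption that the family has the usual Jensen form J_F(p, q) = (F(p) + F(q))/2 − F((p + q)/2) for a convex generator F; the helper names are mine. Choosing the negative Shannon entropy as the generator recovers the Jensen-Shannon divergence, as the abstract states.

```python
import numpy as np

def jensen_divergence(F, p, q):
    """Jensen divergence for a convex generator F:
    J_F(p, q) = (F(p) + F(q)) / 2 - F((p + q) / 2) >= 0 by Jensen's inequality."""
    return 0.5 * (F(p) + F(q)) - F(0.5 * (p + q))

def neg_shannon(p):
    """Negative Shannon entropy (convex in p), in nats."""
    p = p[p > 0]
    return float(np.sum(p * np.log(p)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.1, 0.4, 0.5])
# With F = -H (negative Shannon entropy) this is exactly the
# Jensen-Shannon divergence between p and q.
print(jensen_divergence(neg_shannon, p, q))
```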

A Graph Embedding Method Using the Jensen-Shannon Divergence

Riesen and Bunke recently proposed a novel dissimilarity-based approach for embedding graphs into a vector space. One drawback of their approach is the computational cost of the graph edit operations required to compute the dissimilarity for graphs. In this paper we explore whether the Jensen-Shannon divergence can be used as a means of computing a fast similarity measure between a pair of graphs. We ...
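
The snippet does not say which graph probability distributions the authors feed into the divergence, so the following is only a hypothetical illustration of the general idea: summarize each graph by a discrete distribution (here the degree distribution, one of the simplest choices) and convert the Jensen-Shannon divergence, which is bounded by ln 2, into a similarity score.

```python
import numpy as np

def degree_distribution(adj, max_deg):
    """Empirical degree distribution of a graph given by a 0/1 adjacency matrix."""
    degrees = adj.sum(axis=1).astype(int)
    hist = np.bincount(degrees, minlength=max_deg + 1).astype(float)
    return hist / hist.sum()

def jsd(p, q):
    """Jensen-Shannon divergence between two discrete distributions (nats)."""
    def H(r):
        r = r[r > 0]
        return -np.sum(r * np.log(r))
    m = 0.5 * (p + q)
    return H(m) - 0.5 * (H(p) + H(q))

# Toy graphs: a 4-cycle versus a 4-vertex path
cycle = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]])
path  = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]])
p = degree_distribution(cycle, 3)
q = degree_distribution(path, 3)
similarity = 1.0 - jsd(p, q) / np.log(2.0)  # rescaled to [0, 1]
print(similarity)
```

No graph edit operations appear anywhere, which is the point: the cost is dominated by building the distributions rather than by an expensive edit-distance computation.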

Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures

Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and the Jeffreys [12] J-divergence. The Sibson [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means...
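
To make the symmetric/non-symmetric distinction concrete, here is a short sketch using the standard definitions: the Kullback-Leibler relative information D(p‖q), the Jeffreys J-divergence J(p, q) = D(p‖q) + D(q‖p), and the Jensen-Shannon divergence written through D. This only illustrates the quantities such bounds relate; it does not reproduce the paper's inequalities.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler relative information D(p || q), in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])

d_pq, d_qp = kl(p, q), kl(q, p)
jeffreys = d_pq + d_qp                 # symmetric by construction
m = 0.5 * (p + q)
jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)  # Jensen-Shannon divergence, also symmetric
print(d_pq, d_qp)   # unequal in general: D is non-symmetric
print(jeffreys, jsd)
```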

deBruijn identities: from Shannon, Kullback–Leibler and Fisher to generalized φ-entropies, φ-divergences and φ-Fisher informations

In this paper we propose a generalization of the usual deBruijn identity that links the Shannon differential entropy (or the Kullback–Leibler divergence) and the Fisher information (or the Fisher divergence) of the output of a Gaussian channel. The generalization makes use of φ-entropies on the one hand, and of φ-divergences (of the Csiszár class) on the other hand, as generalizations of the S...
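
The snippet is truncated, but the classical identity being generalized is standard and worth displaying for orientation: for a Gaussian-channel output X_t = X + √t Z, with Z a standard Gaussian independent of X,

```latex
% Classical de Bruijn identity: the time derivative of the differential
% entropy h of the Gaussian-channel output equals half its Fisher
% information J.
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\, Z\right)
  = \frac{1}{2}\, J\!\left(X + \sqrt{t}\, Z\right)
```

Per the abstract, the paper replaces h with a φ-entropy (or the Kullback–Leibler divergence with a Csiszár φ-divergence) and J with a matching φ-Fisher information.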

Journal:

Volume   Issue

Pages  -

Publication date: 2004